Jeffreys priors for mixture estimation: Properties and alternatives

Authors

Abstract


Similar articles

Generalization of Jeffreys’ Divergence Based Priors for Bayesian Hypothesis testing

In this paper we introduce objective proper prior distributions for hypothesis testing and model selection based on measures of divergence between the competing models; we call them divergence based (DB) priors. DB priors have simple forms and desirable properties, like information (finite sample) consistency; often, they are similar to other existing proposals like the intrinsic priors; moreov...


Default priors for density estimation with mixture models

The infinite mixture of normals model has become a popular method for density estimation problems. This paper proposes an alternative hierarchical model that leads to hyperparameters that can be interpreted as the location, scale and smoothness of the density. The priors on other parts of the model have little effect on the density estimates and can be given default choices. Automatic Bayesian ...
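As a point of reference for the finite-mixture building block underlying this model, here is a minimal maximum-likelihood sketch: fitting a k-component mixture of normals to 1-D data by EM in plain NumPy. This is an illustration only, not the hierarchical model or default priors the paper proposes; the function name and initialization are my own.

```python
import numpy as np

def em_gmm_1d(x, k=2, iters=50):
    """Fit a k-component 1-D Gaussian mixture by EM (maximum likelihood).

    Illustrative sketch only -- no priors, no smoothness hyperparameters.
    """
    n = len(x)
    # Deterministic initialization: spread initial means across the data quantiles.
    mu = np.percentile(x, np.linspace(10.0, 90.0, k))
    sigma = np.full(k, x.std())          # shared initial scales
    w = np.full(k, 1.0 / k)              # mixing weights
    for _ in range(iters):
        # E-step: responsibilities r[i, j] = P(component j | x_i)
        dens = w * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) \
               / (sigma * np.sqrt(2.0 * np.pi))
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: update weights, means, and scales from the responsibilities.
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return w, mu, sigma

# Bimodal synthetic data: components near -2 and 3.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2.0, 1.0, 500), rng.normal(3.0, 0.5, 500)])
w, mu, sigma = em_gmm_1d(x)
```

The estimated density at any point is then the weighted sum of the fitted component densities; the paper's contribution concerns how to place interpretable default priors on such a model, which this sketch deliberately omits.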


Density Estimation by Mixture Models with Smoothing Priors

In the statistical approach for self-organizing maps (SOMs), learning is regarded as an estimation algorithm for a Gaussian mixture model with a Gaussian smoothing prior on the centroid parameters. The values of the hyperparameters and the topological structure are selected on the basis of a statistical principle. However, since the component selection probabilities are fixed to a common value,...


Inference in Two-Piece Location-Scale Models with Jeffreys Priors

This paper addresses the use of Jeffreys priors in the context of univariate three-parameter location-scale models, where skewness is introduced by differing scale parameters either side of the location. We focus on various commonly used parameterizations for these models. Jeffreys priors are shown not to allow for posterior inference in the wide and practically relevant class of distributions o...
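The two-piece construction the abstract refers to can be written down directly. Below is a minimal sketch of a two-piece normal density, with one scale on each side of the location; the function and parameter names are illustrative, not taken from the paper.

```python
import numpy as np

def two_piece_normal_pdf(x, mu, sigma_left, sigma_right):
    """Two-piece normal density: scale sigma_left below mu, sigma_right above.

    The 2 / (sigma_left + sigma_right) factor makes the two halves
    integrate to one and keeps the density continuous at mu.
    """
    x = np.asarray(x, dtype=float)
    scale = np.where(x < mu, sigma_left, sigma_right)
    norm = 2.0 / (np.sqrt(2.0 * np.pi) * (sigma_left + sigma_right))
    return norm * np.exp(-0.5 * ((x - mu) / scale) ** 2)

grid = np.linspace(-20.0, 20.0, 40001)
pdf = two_piece_normal_pdf(grid, mu=0.0, sigma_left=1.0, sigma_right=3.0)
```

With `sigma_left != sigma_right` the density is skewed; the paper studies Jeffreys priors over exactly this kind of three-parameter family.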


Flexible Priors for Infinite Mixture Models

Most infinite mixture models in the current literature are based on the Dirichlet process prior. This prior on partitions implies a very specific (a priori) distribution on cluster sizes. A slightly more general prior known as the Pitman-Yor process prior generalizes this to a two-parameter family. The latter is the most general exchangeable partition probability function (EPPF) as defined by P...
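The two-parameter generalization mentioned here can be illustrated through sequential (Chinese-restaurant-style) sampling: under Pitman-Yor, an item joins an existing cluster with probability proportional to its size minus a discount d, and opens a new cluster with probability proportional to alpha + d times the current number of clusters. Setting d = 0 recovers the Dirichlet process. A minimal sketch (function name my own):

```python
import numpy as np

def pitman_yor_partition(n, alpha, d, seed=0):
    """Sample cluster sizes for n items under the Pitman-Yor (d, alpha) prior.

    d = 0 gives the Dirichlet-process (Chinese restaurant process) case.
    Requires 0 <= d < 1 and alpha > -d.
    """
    rng = np.random.default_rng(seed)
    sizes = []                            # sizes[j] = items in cluster j
    for _ in range(n):
        k = len(sizes)
        # Existing cluster j: weight sizes[j] - d;  new cluster: alpha + d * k.
        weights = np.array([s - d for s in sizes] + [alpha + d * k], dtype=float)
        probs = weights / weights.sum()
        j = rng.choice(k + 1, p=probs)
        if j == k:
            sizes.append(1)               # open a new cluster
        else:
            sizes[j] += 1                 # join an existing cluster
    return sizes

dp_sizes = pitman_yor_partition(1000, alpha=1.0, d=0.0)   # Dirichlet process
py_sizes = pitman_yor_partition(1000, alpha=1.0, d=0.5)   # Pitman-Yor
```

Comparing the two draws shows the point the abstract makes: the discount d reshapes the a priori distribution of cluster sizes, typically yielding many more small clusters than the Dirichlet process.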



Journal

Journal title: Computational Statistics & Data Analysis

Year: 2018

ISSN: 0167-9473

DOI: 10.1016/j.csda.2017.12.005